
    The Louisiana Science Education Act's impact on high school science

    Over the last two decades, extensive research has been conducted concerning the legality and effectiveness of teaching opposing viewpoints on controversial topics in science education. However, one of the most important aspects of this dilemma has been disregarded: the effect it has on individual teachers in their unique environments. The purpose of this research was to analyze teachers' comprehension of recent state legislation and how it affects their instruction. This quantitative study used an online survey of secondary biology teachers, focusing on their teaching experience, their understanding of the Louisiana Science Education Act of 2008, and their personal views on how evolution should be taught. The research found that about one-third (31%) of biology teachers in Louisiana thought creationism should be taught alongside evolution. About 3 in 4 biology teachers knew what the Louisiana Science Education Act was, and 1 in 10 said it influenced their instruction. These data support the hypothesis that recent state legislation has little impact on the daily instruction of science educators.

    A GPU based X-Engine for the MeerKAT Radio Telescope

    The correlator is a key component of the digital backend of a modern radio telescope array. The 64-antenna MeerKAT telescope has an FX-architecture correlator consisting of 64 F-Engines and 256 X-Engines. These F- and X-Engines are all hosted on 128 custom-designed FPGA processing boards. This custom board is known as a SKARAB. One SKARAB X-Engine board hosts four logical X-Engines. This SKARAB ingests data at 27.2 Gbps over a 40 GbE connection and correlates it in real time. GPU technology has improved significantly since SKARAB was designed, and GPUs are now becoming viable alternatives to FPGAs in high-performance streaming applications. The objective of this dissertation is to investigate how to build a GPU drop-in replacement X-Engine for MeerKAT and to compare this implementation to a SKARAB X-Engine. This includes the construction and analysis of a prototype GPU X-Engine. The 40 GbE ingest, the GPU correlation algorithm, and the software pipeline framework that links these two together were identified as the three main sub-systems to focus on in this dissertation. A number of different tools implementing these sub-systems were examined, with the most suitable ones being chosen for the prototype. A prototype dual-socket system was built that could process the equivalent of two SKARABs' worth of X-Engine data. This prototype has two 40 GbE Mellanox NICs running the SPEAD2 library and a single Nvidia GeForce 1080Ti GPU running the xGPU library. A custom pipeline framework built on top of the Intel Threading Building Blocks (TBB) library was designed to facilitate the flow of data between these sub-systems. The prototype system was compared to two SKARABs. For an equivalent amount of processing, the GPU X-Engine cost R143 000 while the two SKARABs cost R490 000. The power consumption of the GPU X-Engine was more than twice that of the SKARABs (400 W compared with 180 W), while only requiring half as much rack space.
GPUs as X-Engines were found to be more suitable than FPGAs when cost and density are the main priorities. When power consumption is the priority, then FPGAs should be used. When running eight logical X-Engines, 85% of the prototype's CPU cores were used while only 75% of the GPU's compute capacity was utilised. The main bottleneck on the GPU X-Engine was on the CPU side of the server. This report suggests that the next iteration of the system should offload some CPU-side processing to the GPU and double the number of 40 GbE ports. This could potentially double the system throughput. When considering methods to improve this system, an FPGA/GPU hybrid X-Engine concept was developed that would combine the power-saving advantage of FPGAs and the low cost-to-compute ratio of GPUs.
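
The core operation of an X-Engine described above — forming and accumulating the cross-correlation products of every antenna pair, per frequency channel — can be sketched in a few lines of NumPy. This is an illustrative reference implementation only (the dissertation's prototype uses the xGPU library, not this code); the array shapes are assumptions chosen for clarity, with tiny antenna and channel counts rather than MeerKAT's 64 antennas.

```python
import numpy as np

def x_engine(voltages):
    """Correlate channelised antenna voltages.

    voltages: complex array of shape (n_time, n_chan, n_ant), i.e. the
    channelised output of the F-Engines. Returns the accumulated
    visibility matrix of shape (n_chan, n_ant, n_ant), where entry
    [c, i, j] is sum over t of v_i(t, c) * conj(v_j(t, c)).
    """
    n_time, n_chan, n_ant = voltages.shape
    vis = np.zeros((n_chan, n_ant, n_ant), dtype=np.complex128)
    for t in range(n_time):
        for c in range(n_chan):
            v = voltages[t, c]
            # Outer product forms all pairwise correlation products at once.
            vis[c] += np.outer(v, np.conj(v))
    return vis

# Tiny example: 2 antennas, 1 channel, 4 time samples of identical signals.
v = np.ones((4, 1, 2), dtype=np.complex128)
vis = x_engine(v)
print(vis[0])  # four accumulations of the 2x2 all-ones matrix
```

A production X-Engine batches this accumulation over many spectra so the GPU's compute capacity, rather than memory bandwidth, becomes the limiting factor.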

    The Shining School Upon the Hill: Teacher Subjectivity in a Successful Charter School

    This dissertation adds to the growing body of literature on charter school reform. Through the use of double insight, this paper details the tensions between school structures and teacher experience at a “successful” urban charter school. How do teachers construct their subjectivities in relation to a charter school’s mission and guiding philosophy? What are the inter-actions between these teacher biographies and the school’s prominent structures? This paper problematizes common discourse on charters, often reduced to identifying schools as either “good” or “bad,” to contribute to a more nuanced discussion between charter advocates and opponents. This research utilizes a theoretical framework (poststructuralism) and method (ethnography) often neglected in research on charter schools. My poststructural lens is particularly informed by Foucault’s notion of power and Deleuze’s notion of norming. By focusing on two teachers, Maria and Barbara, I constructed my research on a foundation of teacher voice. Furthermore, I coded and themed teacher interviews, observational fieldnotes, internal school reports, and state and federal policy documents. Four structures support the charter’s mission as a technology-infused school preparing students for the modern workforce: democratization, continuous professional development, community building, and inter-disciplinary learning and teaching. These four structures inter-acted with Barbara and Maria’s subjectivities throughout my time at Pennsylvania High Charter School, resulting in three tensions: professional expectations, individualization, and surveillance. These seven themes and two subjectivities created a network of relationships, useful for understanding how teachers navigate the expectations of a “successful” charter school in relation to their own understanding of “effective” curriculum and pedagogy. Teacher subjectivity is central to this research project for two reasons.
First, teacher voice has largely been omitted from previous research on charter schools. Second, teacher narratives can emphasize the disconnect between theory, such as formal school structures, and practice, or the lives and experiences of teachers in their classrooms. By focusing on this tension of praxis, a more nuanced discussion of charter schools is made possible.

    Monitoring in a grid cluster

    The monitoring of a grid cluster (or of any piece of reasonably scaled IT infrastructure) is a key element in the robust and consistent running of that site. Several factors are important to the selection of a useful monitoring framework, including ease of use, reliability, and data input and output. It is critical that data can be drawn from different instrumentation packages and collected in the framework to allow for a uniform view of the running of a site. It is also very useful to allow different views and transformations of this data, so that it can be manipulated for different purposes, perhaps unknown at the initial time of installation. In this context, we present the findings of an investigation of the Graphite monitoring framework and its use at the ScotGrid Glasgow site. In particular, we examine the messaging system used by the framework and the means to extract data from different tools, including the existing Ganglia framework which is in use at many sites, in addition to adapting and parsing data streams from external monitoring frameworks and websites.
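
Feeding data from disparate tools into Graphite, as described above, typically goes through Carbon's plaintext protocol: one line per metric, of the form `<metric.path> <value> <unix_timestamp>`, sent over TCP (port 2003 by default). The sketch below shows this protocol; the metric path and hostname are illustrative placeholders, not the actual names used at the ScotGrid Glasgow site.

```python
import socket
import time

def graphite_line(path, value, timestamp=None):
    """Format one metric in Carbon's plaintext protocol:
    '<metric.path> <value> <unix_timestamp>\\n'."""
    if timestamp is None:
        timestamp = int(time.time())
    return f"{path} {value} {timestamp}\n"

def send_metrics(lines, host="carbon.example.org", port=2003):
    """Push a batch of plaintext lines to a Carbon daemon.
    Host and port here stand in for the site's actual relay."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall("".join(lines).encode("ascii"))

# A parser adapting an external tool (e.g. Ganglia) would emit lines
# like this one, using a hypothetical metric naming scheme:
line = graphite_line("site.cluster.worker01.load", 0.72, 1400000000)
print(line.strip())
```

Because the wire format is just text, adapting an existing tool to Graphite reduces to parsing its output and renaming its metrics into a consistent dotted hierarchy.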

    A voyage to Arcturus: a model for automated management of a WLCG Tier-2 facility

    With the current trend towards "On Demand Computing" in big data environments, it is crucial that the deployment of services and resources becomes increasingly automated. Deployment based on cloud platforms is available for large-scale data centre environments, but these solutions can be too complex and heavyweight for smaller, resource-constrained WLCG Tier-2 sites. Along with a greater desire for bespoke monitoring and collection of Grid-related metrics, a more lightweight and modular approach is desired. In this paper we present a model for a lightweight automated framework which can be used to build WLCG grid sites, based on "off the shelf" software components. As part of the research into an automation framework, the use of both IPMI and SNMP for physical device management will be included, as well as the use of SNMP as a monitoring/data sampling layer, such that more comprehensive decision making can take place and potentially be automated. This could lead to reduced downtimes and better performance as services are recognised to be in a non-functional state by autonomous systems.
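
The decision-making layer described above — sampling a health metric and autonomously recognising a non-functional state — can be reduced to a simple policy over a sliding window of readings. The following is a minimal sketch of that idea, not the paper's framework: the sample source (in practice an SNMP poller) and the remediation (in practice an IPMI power-cycle or service restart) are both stubbed out, and the window length and action name are hypothetical.

```python
def decide_action(samples, threshold, window=5):
    """Return a remediation action if the last `window` samples of a
    health metric all exceed `threshold`, else None.

    Requiring a sustained breach, rather than acting on a single
    reading, avoids restarting services on transient spikes.
    """
    recent = samples[-window:]
    if len(recent) == window and all(s > threshold for s in recent):
        return "restart-service"
    return None

# One transient spike is ignored; a sustained breach triggers action.
print(decide_action([10, 95, 12, 11, 9], threshold=90))    # no action
print(decide_action([95, 96, 97, 98, 99], threshold=90))   # remediate
```

In a deployed framework the returned action would be dispatched to a per-device handler, with IPMI used for hosts that have stopped responding to SNMP entirely.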

    Mapping disease data: A usability test of an internet based system of disease status disclosure

    Disease maps are important tools in the management of disease. By communicating risk, disease maps can help raise awareness of disease and encourage farmers and veterinarians to employ best practice to limit the spread of disease. However, despite the importance of disease maps in communicating risk and the existence of various online disease maps, there are few studies that explicitly examine their usability. Where disease maps are complicated to use, they are unlikely to be used effectively. The paper outlines an attempt to create an open-access, online, searchable map of incidents of bovine tuberculosis in England and Wales, and analyzes its usability among veterinarians. The paper describes the process of creating the map before presenting the results of a series of usability trials. Results show the map to score highly on different measures of usability. However, the trials also revealed a number of social and technical limitations and challenges facing the use of online disease maps, including reputational dangers, role confusion, data accuracy, and data representation. The paper considers the challenges facing disease maps and their potential role in designing new methodologies to evaluate the effectiveness of disease prevention initiatives.

    An introduction to communicating science

    It is becoming increasingly recognised that students in Higher Education must acquire the skills necessary for professional and personal development, as well as for academic progress. The media have recently focused on the issue of declining public interest in the sciences and the lack of accurate reporting of science. We have developed a new programme which endeavours to address both issues, involving a three-day intensive course covering writing, TV and radio. In addition to the targeted activities of learning the skills of science communication, the programme encourages partnerships and exploits the resources and expertise available from various institutions. The undertaking of this type of programme is not limited to the acquisition of time slots in a studio such as Bush House. Most university campuses are now home to their own recording studios and some even have television facilities; however, the programme requires only a video camera and audio recording equipment. The success of this science communication module and of two others run by MOAC and CBC (Team Development and Decision-making and Leadership) has encouraged us to develop a complete postgraduate certificate in transferable skills. We anticipate the certificate will be a valuable vehicle for consolidating and enhancing the training discussed in this article.

    Prediction of human intestinal absorption using micellar liquid chromatography with an aminopropyl stationary phase

    The extent of human intestinal absorption (HIA) of a drug is considered to be an important pharmacokinetic parameter which must be determined for orally administered drugs. Traditional experimental methods relied upon animal testing and are renowned for being time-consuming and expensive, as well as ethically unfavourable. As a result, developing alternative methods to evaluate a drug's pharmacokinetics is crucial. Micellar liquid chromatography (MLC) is considered to be one of these methods that can replace the use of animals in the prediction of HIA. In this study, the combination of an aminopropyl column with the biosurfactant sodium deoxycholate (NaDC) bile salt was used in the experimental determination of micelle-water partition coefficients (log Pmw) for a group of compounds. Multiple linear regression (MLR) was then used for the prediction of HIA from the experimentally determined log Pmw along with other molecular descriptors, leading to the construction of a model equation with R2 = 85% and a prediction power of R2pred = 72%. The use of MLC with an aminopropyl column in combination with NaDC was found to be a good method for the prediction of human intestinal absorption, providing data for a far wider range of compounds compared with previous studies.
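
The MLR step described above fits %HIA as a linear combination of log Pmw and other molecular descriptors. A minimal least-squares sketch of that procedure is shown below; the data here are synthetic and purely illustrative (the study's measured values and chosen descriptors are not reproduced), and the single-descriptor setup is an assumption made for brevity.

```python
import numpy as np

def fit_mlr(X, y):
    """Ordinary least-squares fit of y = b0 + b1*x1 + ... + bk*xk.
    X: (n_samples, n_descriptors), e.g. log Pmw plus other molecular
    descriptors; y: observed response such as %HIA."""
    A = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    """Apply a fitted MLR model to new descriptor values."""
    A = np.column_stack([np.ones(len(X)), X])
    return A @ coef

# Synthetic demo only -- not the paper's measurements.
X = np.array([[1.0], [2.0], [3.0], [4.0]])   # stand-in for log Pmw
y = np.array([20.0, 40.0, 60.0, 80.0])       # perfectly linear response
coef = fit_mlr(X, y)
print(predict(coef, np.array([[2.5]])))      # interpolates the trend
```

In practice the model is judged by the fitted R2 and, more importantly, by a cross-validated or external-set R2pred, since a high R2 alone can reflect overfitting to the training compounds.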